Linux Academy Hands-On Lab

Deploying an App Engine Application​

Google Cloud App Engine provides a solid, easy-to-access platform for a wide range of web apps. Its interoperability with other Google Cloud services, such as Cloud Datastore, enhances its effectiveness. In this hands-on lab, we’ll deploy an app to App Engine that allows users to enter details into a NoSQL database, Cloud Datastore, and displays the proper HTML template and CSS. The process requires that we customize the config.py file before deploying the app.

Learning Objectives​

Enable APIs and clone GitHub repository.​

  1. From the main navigation, visit the APIs & Services > Library page, and search for "Datastore".
  2. Select the Cloud Datastore API.
  3. If the API is not available, click Enable.
  4. Activate the Cloud Shell.
  5. When it spins up, create the necessary bucket (bucket name must be unique):
    gsutil mb -c regional -l us-east1 gs://[BUCKET_NAME]
    gsutil acl ch -u AllUsers:R gs://[BUCKET_NAME]
  6. Clone the GitHub repository:
    git clone https://github.com/linuxacademy/content-gc-essentials
  7. Change directory to the content-gc-essentials/app-engine-lab folder.

Configure config.py file.​

  1. From the Cloud Shell Editor, open config.py in the app-engine-lab folder.
  2. Change PROJECT_ID to the current project, as shown in the Cloud Shell.
  3. Change the "CLOUD_STORAGE_BUCKET" variable value to your unique bucket name.
  4. Save the file.
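After editing, config.py should contain values along these lines (the variable names come from the steps above; the example values below are placeholders, not real project values):

```python
# Hypothetical sketch of config.py after editing -- your values will differ.
PROJECT_ID = 'my-project-id'               # the project ID shown in the Cloud Shell
CLOUD_STORAGE_BUCKET = 'my-unique-bucket'  # the bucket created earlier
```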

Deploy the app.​

  1. In the Cloud Shell, enter the following code:
    gcloud app deploy
  2. When prompted, choose the nearest region.

Test the app.​

  1. In the Cloud Shell, enter the following code:
    gcloud app browse
  2. If the browser window does not open, click the generated link.

Triggering a Cloud Function with Cloud Pub/Sub​

Cloud Functions can be triggered in two ways: through a direct HTTP request or through a background event. One of the most frequently used services for background events is Cloud Pub/Sub, Google Cloud’s platform-wide messaging service. In this pairing, Cloud Functions becomes a direct subscriber to a Cloud Pub/Sub topic, and its code runs whenever a message is published to that topic. In this hands-on lab, we’ll walk through the entire experience, from setup to confirmation.

Learning Objectives​

Enable required APIs.​

  1. Use the API Library to find and enable the Cloud Pub/Sub API.
  2. Use the API Library to find and enable the Cloud Functions API.

Create Pub/Sub topic.​

  1. From the main console navigation, go to Cloud Pub/Sub.
  2. Click Create a topic.
  3. In the dialog, enter greetings as the topic name.
  4. Click Create.

Create a Cloud Function.​

  1. From the main console, go to Cloud Functions.
  2. Click Create function.
  3. Configure the function with the following values:
    • Name: la-pubsub-function
    • Trigger: Cloud Pub/Sub
    • Topic: greetings
    • Source code: Inline editor
    • Runtime: Python 3.7
  4. In the main.py field, enter the following code:
    import base64

    def greetings_pubsub(data, context):
        if 'data' in data:
            name = base64.b64decode(data['data']).decode('utf-8')
        else:
            name = 'folks'
        print('Greetings {} from Linux Academy!'.format(name))
  5. Set Function to Execute to greetings_pubsub.
  6. Click Create.

Publish message to topic from console.​

  1. Click the newly created Cloud Function name.
  2. Switch to the Trigger tab.
  3. Click the topic link to go to the Cloud Pub/Sub topic.
  4. From the Topic page, click Publish Message.
  5. In the Message field, enter everyone around the world.
  6. Click Publish.

Confirm Cloud Function execution.​

  1. Return to the Cloud Functions dashboard.
  2. Click the Cloud Function's name.
  3. From the Cloud Function navigation, click View Logs.
  4. Locate the most recent log.
  5. Confirm function execution.

Trigger Cloud Function directly from command line.​

  1. Click Activate Cloud Shell from the top row of the console.
  2. In the Cloud Shell, enter the following code:
    DATA=$(printf 'my friends' | base64)
    gcloud functions call la-pubsub-function --data '{"data":"'$DATA'"}'
  3. After a moment, refresh the logs page and confirm the Cloud Function execution.
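The --data payload mirrors the Pub/Sub message format: the message body arrives base64-encoded under the data key. You can sanity-check the handler logic locally with plain Python (a sketch; the real trigger passes a context object rather than None):

```python
import base64

def greetings_pubsub(data, context):
    # Pub/Sub delivers the message body base64-encoded under 'data'.
    if 'data' in data:
        name = base64.b64decode(data['data']).decode('utf-8')
    else:
        name = 'folks'
    print('Greetings {} from Linux Academy!'.format(name))

# Simulate the event that the gcloud command above constructs.
event = {'data': base64.b64encode(b'my friends').decode('utf-8')}
greetings_pubsub(event, None)  # prints: Greetings my friends from Linux Academy!
```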

Publish message to topic from command line.​

  1. In the Cloud Shell, enter the following command:
    gcloud pubsub topics publish greetings --message "y'all"
  2. After a moment, refresh the log page to confirm the function has executed.

Working with Compute Engine Windows Instances​

Compute Engine VM instances can utilize a wide range of boot disks; Google Cloud currently offers well over 50 disk images to choose from. Cloud computing professionals must be adept at spinning up a variety of operating systems, including Windows Server. In this hands-on lab, you’ll create a Windows-based Compute Engine VM instance, set up an IIS server, and push your first web page live to confirm the server’s operation.

Learning Objectives​

Create a Compute Engine VM instance.​

  1. From the main navigation, choose Compute Engine > VM instances.
  2. In the VM Instances area, click Create.
  3. With New VM instance chosen from the options on the left, configure your instance:
    • In the name field, provide a relevant name using hyphens, like la-windows-1.
    • Keep the suggested Region and Zone.
    • In the Boot Disk section, click Change and select Windows Server 2019 Datacenter, Server with Desktop Experience from the list; click Select.
  4. Under Firewall, choose the Allow HTTP traffic option.
  5. Click Create.

Set Windows password.​

  1. From the RDP options, click Set Windows password.
  2. In the dialog, confirm your username is correct and click Set.
  3. Copy the supplied password and click Close.

Launch RDP window.​

  1. Launch the RDP window by using one of the following methods:
    • If you're on a Windows system, click RDP.
    • If you're on a Mac using Chrome in a standard window, first install the Chrome RDP extension, and then click RDP.
    • If you're on a Mac using another browser or Incognito window, from the App Store, download and install the latest version of the Microsoft Remote Desktop app. Then, choose Download the RDP file from the RDP options and open the file.

Install IIS.​

  1. From the Windows Start menu, right-click on Windows PowerShell and choose Run as administrator.
  2. In the PowerShell window, enter the following commands:
    import-module servermanager
    add-windowsfeature web-server -includeallsubfeature
    echo '<!doctype html><html><body><h1>Greetings from Linux Academy!</h1></body></html>' > C:\inetpub\wwwroot\index.html

Test your page.​

  1. From the Compute Engine VM page, click the External IP link for your Windows VM instance.
  2. Review the page in your browser.

Setting Cloud Storage Lifecycle Rules​

While saving an object in a Cloud Storage bucket is relatively inexpensive, there is nonetheless a cost, and that cost varies with the storage class selected. Certain objects need to be highly available at first, requiring the storage class with the highest availability and cost. Such objects may eventually be relegated to less available, less expensive storage classes, and may even be deleted. Managing these objects over time can be automated by establishing lifecycle rules. In this hands-on lab, we'll set a variety of lifecycle rules for Google Cloud Storage buckets, both from the console and the command line.

Learning Objectives​

Create a Cloud Storage bucket.​

  1. From the Google Cloud console main navigation, choose Cloud Storage.
  2. Click Create bucket.
  3. Name the bucket uniquely.
  4. In the Storage Class section, select Regional.
  5. Click Create.

Define first lifecycle rule.​

  1. From the Cloud Storage browser page, click None in the Lifecycle column for the bucket just created.
  2. Click Add rule.
  3. Under Select object conditions, set the following:
    • Age: 180
    • Storage class: Regional, Standard
  4. Click Continue.
  5. Under Select action, choose Set to Nearline.
  6. Click Continue.
  7. Click Save.

Define second lifecycle rule.​

  1. From the Cloud Storage browser page, click Enabled in the Lifecycle column.
  2. Click Add rule.
  3. Under Select object conditions, set the following:
    • Age: 365
    • Storage class: Nearline
  4. Click Continue.
  5. Under Select action, choose Set to Coldline.
  6. Click Continue.
  7. Click Save.
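The two console rules correspond to a lifecycle policy you could also manage as JSON with gsutil lifecycle set. A sketch of the equivalent document (field names follow the Cloud Storage lifecycle schema; the ages and storage classes come from the steps above):

```python
import json

# Equivalent of the two console rules above, as a gsutil lifecycle document.
lifecycle = {
    "rule": [
        {
            # Rule 1: move Regional/Standard objects to Nearline after 180 days.
            "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
            "condition": {"age": 180, "matchesStorageClass": ["REGIONAL", "STANDARD"]},
        },
        {
            # Rule 2: move Nearline objects to Coldline after 365 days.
            "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
            "condition": {"age": 365, "matchesStorageClass": ["NEARLINE"]},
        },
    ]
}
print(json.dumps(lifecycle, indent=2))
```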

From command line, get lifecycle rules.​

  1. Click Activate Cloud Shell.
  2. In the Cloud Shell, enter the following code:
    gsutil lifecycle get gs://[BUCKET_NAME]
  3. Review output.

Set lifecycle rule with JSON file.​

  1. Clone a repo and change to the lab's directory:
    git clone https://github.com/linuxacademy/content-gc-essentials
    cd content-gc-essentials/cloud-storage-lifecycle-lab
  2. Review file in editor.
  3. In the Cloud Shell, enter the following code:
    gsutil lifecycle set delete-after-two-years.json gs://[BUCKET_NAME]
  4. Confirm the lifecycle rule has been added in the console.
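As a rough guess at the file's shape, a delete-after-two-years rule would look something like the following (the repo's actual delete-after-two-years.json may differ; 730 days is approximately two years):

```python
import json

# Hypothetical contents of delete-after-two-years.json -- check the repo's file.
policy = {
    "rule": [
        {"action": {"type": "Delete"}, "condition": {"age": 730}}
    ]
}
print(json.dumps(policy, indent=2))
```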

Managing Google Cloud SQL Instances​

When working with Cloud SQL on any scale, you’ll need to manage multiple instances: creating, cloning, starting, stopping, restarting, and, eventually, deleting them. This hands-on lab gives you the experience of performing all those tasks so you’ll be familiar with the steps necessary to handle Cloud SQL instances completely.

Learning Objectives​

Create instance.​

  1. From the main navigation, click Cloud SQL.
  2. Click Create Instance.
  3. Choose your database engine.
  4. Enter an instance ID.
  5. Provide a password.
  6. Click Create.

Create a database.​

  1. Select the recently created Cloud SQL instance.
  2. Choose Create database.
  3. Name the database and leave the other fields at their default values.
  4. Click Create.

Clone the instance.​

  1. From the Instance details page, choose Create clone.
  2. Enter a new name for the clone if desired.
  3. Choose Clone latest state of instance.
  4. Click Create clone.

Stop and start an instance.​

  1. Select the first instance created.
  2. Click Stop.
  3. After the instance has stopped, click Start.

Restart an instance.​

  1. Select the cloned instance.
  2. Click Restart.

Delete an instance.​

  1. Select the cloned instance.
  2. Click Delete.
  3. Enter the name of the instance in the dialog to confirm the deletion.
  4. Click Delete.

Exploring Cloud Firestore in Datastore Mode​

Data comes in all shapes, sizes, and use cases. A relational database service like Cloud SQL isn't always the answer. Cloud Datastore is a NoSQL database service, ideal for semi-structured data that needs to be highly scalable and available. Cloud Firestore is the next generation of Datastore with enhanced features, and in this hands-on lab, you'll see how to build a Firestore NoSQL database in Cloud Datastore mode for the best of both worlds.

Learning Objectives​

Enable APIs.​

  1. From the main navigation, click APIs & Services > Library.
  2. Search for Datastore.
  3. Click Enable.

Create the database.​

  1. From the main navigation, click Datastore.
  2. Choose Datastore Mode.
  3. Select your closest location.
  4. Click Create database.

Define entities.​

  1. For Kind, enter Flights.
  2. Click Create Entity.
  3. Click Add property for each of the following entries, of the specified type:
    • Airline: String
    • Flight Number: Integer
    • Arrival: Date and Time
    • OnTime: Boolean
  4. Click Save.
  5. Repeat steps 3 and 4 twice more with different values.
  6. For the final entry, add another property:
    • Note: Text
  7. Click Save.

Query the data.​

  1. Switch to Query by GQL.
  2. In the Query field, enter the following:
    SELECT * FROM Flights
  3. Click Run Query.
  4. In the Query field, enter the following:
    SELECT * FROM Flights WHERE OnTime = false
  5. Click Run Query.
  6. Review results.
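To see what the second query should return, here is the same filter applied to hypothetical Flights entities modeled as plain Python dicts (the sample values are made up; Datastore evaluates the GQL server-side):

```python
# Hypothetical Flights entities mirroring the properties defined above.
flights = [
    {"Airline": "ACME Air", "Flight Number": 101, "OnTime": True},
    {"Airline": "ACME Air", "Flight Number": 202, "OnTime": False},
    {"Airline": "Sky High", "Flight Number": 303, "OnTime": False},
]

# Equivalent of: SELECT * FROM Flights WHERE OnTime = false
delayed = [f for f in flights if f["OnTime"] is False]
print([f["Flight Number"] for f in delayed])  # [202, 303]
```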

Connecting to Cloud Bigtable with cbt​

Sometimes data is relatively straightforward; there's just an overwhelming amount of it. That's exactly what Cloud Bigtable is meant for. Cloud Bigtable is a fully managed NoSQL database service designed to handle massive amounts of information. In this hands-on lab, you’ll configure database instances and clusters for Cloud Bigtable in the console and then use command-line cbt commands to create a schema and populate the table with data.

Learning Objectives​

Enable API.​

  1. From the main navigation, choose APIs & Services > Library.
  2. In the search field, enter Bigtable.
  3. Select the Cloud Bigtable Admin API card.
  4. Click Enable.

Create Bigtable instance.​

  1. From the main navigation, choose Bigtable in the Storage section.
  2. Choose Create instance.
  3. Set the following fields:
    • Name: la-data-cbt
    • Instance type: Development
    • Storage type: HDD
    • Cluster Region: us-east1
    • Cluster Zone: us-east1-b
  4. Click Done and then Create.

Install and configure cbt.​

  1. Activate the Cloud Shell by clicking its icon in the top row.
  2. In the Cloud Shell, enter the following:
    gcloud components update
    gcloud components install cbt
  3. Configure cbt with the following commands:
    echo project = [PROJECT_ID] > ~/.cbtrc
    echo instance = la-data-cbt >> ~/.cbtrc
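The two echo commands leave a ~/.cbtrc containing the following (with [PROJECT_ID] replaced by your actual project ID):

```
project = [PROJECT_ID]
instance = la-data-cbt
```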

Create data table.​

  1. In the Cloud Shell, enter the following:
    cbt createtable la-table
    cbt ls

Define table structure and add data.​

  1. In the Cloud Shell, enter the following:
    cbt createfamily la-table offerings
    cbt set la-table r1 offerings:c1=labs
    cbt read la-table
  2. Review results.
  3. Enter the following:
    cbt set la-table r1 offerings:c2=courses
    cbt read la-table
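Conceptually, Bigtable stores each row as a nested map: row key to column family to column to value. A plain-Python sketch of row r1 after the two set commands above (an illustration of the data model, not how cbt stores data internally):

```python
# Model of la-table after the cbt set commands:
# row key -> column family -> column -> value
table = {}
table.setdefault("r1", {}).setdefault("offerings", {})["c1"] = "labs"
table["r1"]["offerings"]["c2"] = "courses"

print(table["r1"]["offerings"])  # {'c1': 'labs', 'c2': 'courses'}
```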

Delete Bigtable instance.​

  1. From the console, select the Bigtable instance.
  2. Choose Delete instance.